
    EXAMINING UNIVERSITY STAKEHOLDERS' PERCEPTION OF THE IMPLEMENTATION OF INTERNATIONALIZATION IN HIGHER EDUCATION INSTITUTIONS IN THE UAE

    The globalization of economies and societies worldwide has brought massive transformations to the field of higher education, creating a context for including an ‘international’ dimension in higher education institutions (HEIs). Global learning is essential for the development of cognitive skills and for increased success among academics; hence, institutional stakeholders such as administrators, faculty, and students are key participants in initiatives to internationalize academia. The current study examines the perspectives of institutional stakeholders (top administrators, faculty, and students) concerning the process and implementation of internationalization of higher education in the UAE, revealing its potential benefits and challenges. In a mixed-methods study, data were collected using online questionnaires and semi-structured interviews with institutional stakeholders from eight top-ranked institutions in the UAE. To assess the process of internationalization within the ‘internationalization cube’ framework, official documents regarding policies and strategies were obtained from these institutions. The overall findings suggest that institutional stakeholders mainly view internationalization as a significant phenomenon that serves as a tool for the creation and dissemination of knowledge, ultimately improving the quality of education. The study categorized the institutions under study based on their internationalization efforts, offering decision-makers a rich source of information for planning and implementing internationalization at their institutions.

    Replica Creation Algorithm for Data Grids

    A data grid system is a data management infrastructure that facilitates reliable access to and sharing of large amounts of data, storage resources, and data transfer services, and that can be scaled across distributed locations. This thesis presents a new replication algorithm that improves data access performance in data grids by distributing relevant data copies around the grid. The new Data Replica Creation Algorithm (DRCM) improves the performance of data grid systems by reducing job execution time and making the best use of data grid resources (network bandwidth and storage space). Current algorithms focus on the number of accesses when deciding which files to replicate and where to place them, which ignores the capabilities of the resources. DRCM differs by considering both user and resource perspectives, strategically placing replicas at the locations that provide the lowest transfer cost. The proposed algorithm uses three strategies: a Replica Creation and Deletion Strategy (RCDS), a Replica Placement Strategy (RPS), and a Replica Replacement Strategy (RRS). DRCM was evaluated using network simulation (OptorSim) based on selected performance metrics (mean job execution time, effective network usage, average storage usage, and computing element usage), scenarios, and topologies. The results revealed better job execution time with lower resource consumption than existing approaches. This research contributes replication strategies embodied in one algorithm that enhances data grid performance and is capable of deciding to create or delete more than one file in a single decision. Furthermore, a dependency-level-between-files criterion was utilized and integrated with the exponential growth/decay model to give an accurate file evaluation.
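
    The abstract above describes placing replicas at the lowest-transfer-cost site but does not give the algorithm itself; the following is only a minimal Python sketch of that placement idea under simple assumptions (transfer cost estimated as file size over link bandwidth, and placement blocked when a site lacks free storage). The function, parameter, and site names are illustrative, not taken from DRCM.

    def best_replica_site(file_size_mb, candidate_sites):
        """Pick the candidate site with the lowest estimated transfer cost
        that still has enough free storage for the file.
        candidate_sites maps a site name to {'bandwidth_mbps': ..., 'free_mb': ...}."""
        best_site, best_cost = None, float("inf")
        for site, info in candidate_sites.items():
            if info["free_mb"] < file_size_mb:
                continue  # not enough free storage; skip this site
            # rough transfer time in seconds: megabits to move / link bandwidth
            cost = file_size_mb * 8 / info["bandwidth_mbps"]
            if cost < best_cost:
                best_site, best_cost = site, cost
        return best_site

    # Example: a 500 MB file; the fastest site is full, so the next-cheapest site wins.
    sites = {
        "siteA": {"bandwidth_mbps": 100, "free_mb": 5000},
        "siteB": {"bandwidth_mbps": 1000, "free_mb": 200},
        "siteC": {"bandwidth_mbps": 400, "free_mb": 2000},
    }
    print(best_replica_site(500, sites))  # -> siteC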

    The Impact Of Employee Perceptions On Organizational Commitment

    This study was conducted to investigate the perceptions of bank employees in Gaza, Palestine, and the impact of such perceptions on their commitment to these banks.

    A dynamic replica creation: Which file to replicate?

    A Data Grid is an infrastructure that manages a huge number of data files and provides intensive computational resources across geographically distributed collaborations. To increase resource availability and to ease resource sharing in such an environment, there is a need for replication services. Data replication is one of the methods used to improve the performance of data access in distributed systems. In this paper, we propose a dynamic replication strategy based on the exponential growth or decay rate and the dependency level of data files (EXPM). Simulation results (via OptorSim) show that EXPM outperforms LALW in the measured metrics: mean job execution time, effective network usage, and average storage usage.

    A dynamic replication strategy based on exponential growth/decay rate

    A Data Grid is an infrastructure that manages a huge number of data files and provides intensive computational resources across geographically distributed collaborations. To increase resource availability and to ease resource sharing in such an environment, there is a need for replication services. Data replication is one of the methods used to improve the performance of data access in distributed systems. In this paper, we discuss issues arising in the data replication domain and propose a dynamic replication strategy based on the exponential growth or decay rate. The purpose of the proposed strategy is to identify which files should be replicated. This is achieved by estimating the number of accesses of a file in the upcoming time interval: the greater the value, the more popular the file is, and the more likely it is to be selected for replication.
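
    As a rough illustration of that estimate (a minimal Python sketch, not the authors' code; the rate here is fitted from the last two intervals only, and all names are illustrative), a file's next-interval access count can be extrapolated with an exponential growth/decay model, and the files with the highest predicted counts become the replication candidates:

    import math

    def predicted_accesses(history):
        """Extrapolate next-interval accesses as N(t+1) = N(t) * e**k,
        where k = ln(N(t) / N(t-1)) is the growth/decay rate."""
        if len(history) < 2 or history[-2] <= 0 or history[-1] <= 0:
            return history[-1] if history else 0
        k = math.log(history[-1] / history[-2])
        return history[-1] * math.exp(k)

    def files_to_replicate(histories, top_n=2):
        """Rank files by predicted popularity; the most popular are replicated."""
        ranked = sorted(histories, key=lambda f: predicted_accesses(histories[f]), reverse=True)
        return ranked[:top_n]

    histories = {
        "fileA": [10, 20, 40],  # growing: predicted 80 accesses
        "fileB": [50, 25, 12],  # decaying: predicted ~5.8 accesses
        "fileC": [5, 5, 6],     # roughly flat: predicted ~7.2 accesses
    }
    print(files_to_replicate(histories))  # -> ['fileA', 'fileC']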

    The appeal of the Hari Raya greeting card tradition

    Although the tradition of sending greeting cards for Aidilfitri is increasingly less popular in today's society, some people still use them to send tokens of remembrance.

    Review on Network Function Virtualization in Information-Centric Networking

    Network function virtualization (NFV) and information-centric networking (ICN) are two trending technologies that have attracted experts' attention. NFV is a technique in which network functions (NFs) are decoupled from dedicated hardware appliances and run as software on commodity hardware to create virtual communication services. Virtualized nodes can bring several advantages, such as reduced operating expenses (OPEX) and capital expenses (CAPEX). ICN, on the other hand, is a technique that breaks the host-centric paradigm and shifts the focus to 'named information', i.e., a content-centric model. ICN provides a highly efficient content-retrieval network architecture in which popular contents are cached to minimize duplicate transmissions and to allow mobile users to access popular contents from the caches of network gateways. This paper investigates the implementation of NFV in ICN, critically reviews the strengths and weaknesses of each architecture, and highlights the current issues and future challenges of both. © 2021 IEEE

    An Experimental Investigation of the Optimum Salinity and pH of Seawater to Improve Oil Recovery from a Sandstone Reservoir as a Secondary Recovery Process

    Laboratory tests and field applications show that the salinity of the flooding water can lead to a significant reduction of residual oil saturation. There has been growing interest, with an increasing number of low-salinity water flooding studies. However, there are few quantitative studies on changes in seawater composition and their impact on improving oil recovery. This study investigated two parameters of seawater (salinity and pH) to determine their impact on oil recovery and the optimum salinity and pH for maximum oil recovery. Several core flooding experiments were conducted on sandstone by injecting seawater at high and low salinities and different pH values. The results show that oil recovery increases as the injected water salinity decreases to 6500 ppm and when the pH is around 7. This increase was found to be supported by an increase in permeability. We also observed that the impact of pH on oil recovery is low when the pH is less than 7.

    Comparison of Different Gas Injection Techniques for Better Oil Productivity

    There are many known enhanced oil recovery (EOR) methods, and each method has its own criteria for use. Some of these methods are gas injection methods, such as CO2, N2, and hydrocarbon gas injection, with CO2 injection being the largest contributor to global EOR. Gas injection can be classified into two main types: continuous gas injection (CGI) and water-alternating-gas (WAG) injection. The objective of this research is to propose an initial gas injection plan for the X field to maximize total oil recovery. The feasibility of different gases for maintaining pressure and optimizing oil recovery was examined on a simple mechanistic reservoir model of a considerably depleted saturated oil reservoir. To maximize total oil recovery, the simulation study was conducted on a three-phase compositional simulation model. For further optimization, a sensitivity study was conducted on the injection cycling and component ratios. A sensitivity study was also conducted on parameters such as flow rate and bottom-hole pressure to examine their effects on the overall field recovery. The results obtained in this paper show that WAG CO2 injection is significantly more efficient than the other gas injection schemes and continuous gas injection. Oil recovery depends not only on fluid-to-fluid displacement but also on the compositional phase behavior.

    Replica maintenance strategy for data grid

    A Data Grid is an infrastructure that manages a huge number of data files and provides intensive computational resources across geographically distributed collaborations. The performance of such a system can be increased by improving overall resource usage, which includes network and storage resources. Improving network resource usage is achieved by good utilization of network bandwidth, an important factor affecting job execution time, while improving storage resource usage is achieved by good utilization of storage space. Data replication is one of the methods used to improve the performance of data access in distributed systems by placing multiple copies of data files at distributed sites. Having been distributed to various locations, the replicas need to be monitored, and as a result of dynamic changes in the data grid environment, some of them need to be relocated. In this paper we propose a maintenance replica placement strategy, termed the Unwanted Replica Deletion Strategy (URDS), as part of a replica maintenance service. The main purpose of the proposed strategy is to identify the unwanted replicas to be deleted. OptorSim is used to evaluate the performance of the proposed strategy. The simulation results show that URDS requires less execution time, consumes less network bandwidth, and makes better use of storage space compared to existing approaches.